Weak Convergence of $k$-NN Density and Regression Estimators with Varying $k$ and Applications

Authors
Abstract


Similar articles

k-NN Regression on Functional Data with Incomplete Observations

In this paper we study a general version of regression where each covariate is itself a functional object, such as a distribution or a function. In real applications, however, we typically do not have direct access to such data; instead, only noisy estimates of the true covariate functions/distributions are available to us. For example, when each covariate is a distribution, we might not be...
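
A minimal sketch may help fix ideas; it is illustrative only and not the paper's estimator. It assumes each covariate is a one-dimensional distribution observed only through a finite sample, measures distances between covariates by the empirical 1-Wasserstein distance (scipy.stats.wasserstein_distance), and averages the responses of the k nearest training covariates; the function name and toy data are hypothetical.

# Hedged sketch: k-NN regression where each covariate is a 1-D distribution
# observed only through a finite, noisy sample. Covariate distance is the
# empirical 1-Wasserstein distance between the observed samples.
import numpy as np
from scipy.stats import wasserstein_distance

def knn_regress_distributional(train_samples, train_y, query_sample, k=5):
    """train_samples: list of 1-D arrays (observed samples of the covariate
    distributions); train_y: (n,) responses; query_sample: 1-D array observed
    for the query covariate. Returns the mean response of the k training
    covariates nearest to the query in empirical Wasserstein distance."""
    dists = np.array([wasserstein_distance(query_sample, s) for s in train_samples])
    nearest = np.argsort(dists)[:k]
    return train_y[nearest].mean()

# Toy usage: covariate i is N(mu_i, 1) seen through 50 draws; Y_i = mu_i + noise.
rng = np.random.default_rng(0)
mus = rng.uniform(-2.0, 2.0, size=200)
train_samples = [rng.normal(mu, 1.0, size=50) for mu in mus]
train_y = mus + rng.normal(0.0, 0.1, size=200)
print(knn_regress_distributional(train_samples, train_y, rng.normal(1.0, 1.0, size=50)))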


Optimal rates for k-NN density and mode estimation

We present two related contributions of independent interest: (1) high-probability finite sample rates for k-NN density estimation, and (2) practical mode estimators – based on k-NN – which attain minimax-optimal rates under surprisingly general distributional conditions.
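
For reference, the classical k-NN density estimate at a query x is f_hat(x) = k / (n * v_d * r_k(x)^d), where r_k(x) is the distance from x to its k-th nearest sample point and v_d is the volume of the unit ball in R^d. The sketch below is a minimal illustration; the exact estimator and conditions analyzed in the paper may differ.

# Hedged sketch of the standard k-NN density estimator.
import numpy as np
from math import gamma, pi

def knn_density(X, x, k=10):
    """X: (n, d) array of sample points; x: (d,) query point."""
    n, d = X.shape
    r_k = np.sort(np.linalg.norm(X - x, axis=1))[k - 1]  # distance to k-th nearest neighbor
    v_d = pi ** (d / 2) / gamma(d / 2 + 1)               # volume of the unit ball in R^d
    return k / (n * v_d * r_k ** d)

rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 2))                       # sample from N(0, I_2)
print(knn_density(X, np.zeros(2), k=50))                 # true density at 0 is 1/(2*pi) ~= 0.159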


Rates of Uniform Consistency for k-NN Regression

We derive high-probability finite-sample uniform rates of consistency for k-NN regression that are optimal up to logarithmic factors under mild assumptions. We moreover show that k-NN regression automatically adapts to an unknown lower intrinsic dimension. We then apply the k-NN regression rates to establish new results about estimating the level sets and global maxima of a function from noisy o...
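
The estimator these rates refer to is the plain k-NN regressor, which averages the responses of the k training points nearest to the query; a minimal sketch with hypothetical toy data follows.

# Hedged sketch of the k-NN regression estimator m_hat(x): the average of Y_i
# over the k training points closest to x in Euclidean distance.
import numpy as np

def knn_regress(X, Y, x, k=10):
    """X: (n, d) covariates; Y: (n,) responses; x: (d,) query point."""
    nearest = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
    return Y[nearest].mean()

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(2000, 2))
Y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.standard_normal(2000)   # noisy m(x) = sin(pi * x_1)
print(knn_regress(X, Y, np.array([0.5, 0.0]), k=25))            # true m(0.5, 0) = 1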


Weak signed Roman k-domination in graphs

Let $k\ge 1$ be an integer, and let $G$ be a finite and simple graph with vertex set $V(G)$. A weak signed Roman $k$-dominating function (WSRkDF) on a graph $G$ is a function $f:V(G)\rightarrow\{-1,1,2\}$ satisfying the condition that $\sum_{x\in N[v]}f(x)\ge k$ for each vertex $v\in V(G)$, where $N[v]$ is the closed neighborhood of $v$. The weight of a WSRkDF $f$ is $w(f)=\sum_{v\in V(G)}f(v)$. The weak si...
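
The definition lends itself to a mechanical check: given a graph and a labeling f with values in {-1, 1, 2}, verify that f sums to at least k over every closed neighborhood N[v]. A minimal sketch with a hypothetical 4-cycle example:

# Hedged sketch: verify whether a labeling is a weak signed Roman k-dominating
# function (WSRkDF) and compute its weight w(f). The example graph and labels
# are illustrative, not taken from the paper.
def is_wsrkdf(adj, f, k):
    """adj: dict vertex -> set of neighbors; f: dict vertex -> label."""
    if any(f[v] not in (-1, 1, 2) for v in adj):
        return False
    # Require the sum of f over the closed neighborhood N[v] to be at least k for every v.
    return all(f[v] + sum(f[u] for u in adj[v]) >= k for v in adj)

def weight(f):
    return sum(f.values())

# 4-cycle C4 with labels (2, 1, 2, -1): each closed neighborhood sums to >= 1.
C4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
f = {0: 2, 1: 1, 2: 2, 3: -1}
print(is_wsrkdf(C4, f, k=1), weight(f))   # True 4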


k-NN Regression Adapts to Local Intrinsic Dimension

Many nonparametric regressors were recently shown to converge at rates that depend only on the intrinsic dimension of data. These regressors thus escape the curse of dimension when high-dimensional data has low intrinsic dimension (e.g. a manifold). We show that k-NN regression is also adaptive to intrinsic dimension. In particular our rates are local to a query x and depend only on the way mas...
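
A small, purely illustrative experiment (not from the paper) conveys the point: when covariates lie on a one-dimensional curve embedded in R^10, the neighbors of a query are governed by the curve's 1-D geometry, so plain k-NN regression behaves as if the data were one-dimensional. The embedding and data below are hypothetical.

# Hedged sketch: k-NN regression on data whose intrinsic dimension is 1 even
# though the ambient dimension is 10.
import numpy as np

def embed(t):
    """Hypothetical smooth embedding of [0, 1] into R^10."""
    return np.stack([np.sin((j + 1) * np.pi * t) for j in range(10)], axis=-1)

rng = np.random.default_rng(0)
t = rng.uniform(0.0, 1.0, size=3000)
X = embed(t)                                            # covariates on a 1-D curve in R^10
Y = np.cos(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)

x_query = embed(np.array(0.3))
nearest = np.argsort(np.linalg.norm(X - x_query, axis=1))[:25]
print(Y[nearest].mean(), np.cos(2 * np.pi * 0.3))       # k-NN estimate vs. true regression value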



Journal

Journal title: The Annals of Statistics

Year: 1987

ISSN: 0090-5364

DOI: 10.1214/aos/1176350487